A stochastic subspace approach to gradient-free optimization in high dimensions

Authors

Abstract

We present a stochastic descent algorithm for unconstrained optimization that is particularly efficient when the objective function is slow to evaluate and gradients are not easily obtained, as in some PDE-constrained optimization and machine learning problems. The method maps the gradient onto a low-dimensional random subspace of dimension $\ell$ at each iteration, similar to coordinate descent but without restricting the directional derivatives to lie along the axes. Without requiring a full gradient, this mapping can be performed by computing $\ell$ directional derivatives (e.g., via forward-mode automatic differentiation). We give proofs of convergence in expectation under various convexity assumptions, as well as probabilistic convergence results under strong convexity. Our method provides a novel extension of the well-known Gaussian smoothing technique to descent in subspaces of dimension greater than one, opening the door to new analysis of Gaussian smoothing when more than one directional derivative is used per iteration. We also provide a finite-dimensional variant of a special case of the Johnson–Lindenstrauss lemma. Experimentally, we show that our method compares favorably to coordinate descent, Gaussian smoothing, and BFGS (when gradients are calculated via automatic differentiation) on problems from the shape optimization literature.
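The abstract sketches the per-iteration structure of the algorithm; below is a minimal illustration of one plausible reading of it. The function names (subspace_descent_step, directional_derivative), the scaling of the random basis, the step size, and the use of forward differences in place of forward-mode automatic differentiation are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def directional_derivative(f, x, v, eps=1e-6):
    """Forward-difference estimate of the derivative of f at x along v
    (a stand-in for the forward-mode AD mentioned in the abstract)."""
    return (f(x + eps * v) - f(x)) / eps

def subspace_descent_step(f, x, ell, step_size, rng):
    """One descent step restricted to a random ell-dimensional subspace."""
    d = x.size
    # Random basis with orthonormal columns, scaled so that E[P @ P.T] = I.
    Q, _ = np.linalg.qr(rng.standard_normal((d, ell)))
    P = np.sqrt(d / ell) * Q
    # Only ell directional derivatives are needed, never the full gradient.
    grad_sub = np.array([directional_derivative(f, x, P[:, j]) for j in range(ell)])
    return x - step_size * (P @ grad_sub)

# Tiny usage example on a quadratic in d = 1000 dimensions with ell = 5.
rng = np.random.default_rng(0)
f = lambda z: 0.5 * float(z @ z)
x = rng.standard_normal(1000)
print("initial objective:", f(x))
for _ in range(1000):
    x = subspace_descent_step(f, x, ell=5, step_size=0.005, rng=rng)
print("final objective:", f(x))
```

In this toy setting each iteration costs only $\ell$ function-difference evaluations, which is the regime the abstract targets: objectives that are expensive to evaluate and whose full gradients are unavailable.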


Similar articles

Hogwild: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent

Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show using novel theoretical analysis, algorithms, and implementation that SGD can be implem...
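The summary above is truncated, but the lock-free idea it describes can be illustrated with a small, hedged sketch: several threads run SGD on a shared parameter vector with no locks, each touching only the few coordinates of its sparse example. This only illustrates the access pattern (CPython's GIL serializes the threads); names such as worker, supports, and lr are invented for the example and are not from that paper.

```python
import threading
import numpy as np

d, n_examples, n_threads, lr = 100, 2000, 4, 0.05
rng = np.random.default_rng(1)
w = np.zeros(d)                                  # shared parameters, no lock

# Sparse least-squares examples: each touches only a few coordinates,
# which is the regime a lock-free scheme relies on.
supports = [rng.choice(d, size=5, replace=False) for _ in range(n_examples)]
features = [rng.standard_normal(5) for _ in range(n_examples)]
targets = [feat.sum() for feat in features]      # true weights are all ones

def worker(indices):
    for i in indices:
        j, x_i, y_i = supports[i], features[i], targets[i]
        err = x_i @ w[j] - y_i                   # read shared w without locking
        w[j] -= lr * err * x_i                   # sparse, unsynchronized update

threads = [threading.Thread(target=worker, args=(range(t, n_examples, n_threads),))
           for t in range(n_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("mean learned weight on touched coordinates:",
      float(w[np.unique(np.concatenate(supports))].mean()))
```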


Stochastic Zeroth-order Optimization in High Dimensions

We consider the problem of optimizing a high-dimensional convex function using stochastic zeroth-order queries. Under sparsity assumptions on the gradients or function values, we present two algorithms: a successive component/feature selection algorithm and a noisy mirror descent algorithm using Lasso gradient estimates, and show that both algorithms have convergence rates that depend only loga...


Conjugate Gradient with Subspace Optimization

In this paper we present a variant of the conjugate gradient (CG) algorithm in which we invoke a subspace minimization subproblem on each iteration. We call this algorithm CGSO for “conjugate gradient with subspace optimization”. It is related to earlier work by Nemirovsky and Yudin. We apply the algorithm to solve unconstrained strictly convex problems. As with other CG algorithms, the update ...


Tractable stochastic analysis in high dimensions via robust optimization

Modern probability theory, whose foundation is based on the axioms set forth by Kolmogorov, is currently the major tool for performance analysis in stochastic systems. While it offers insights in understanding such systems, probability theory, in contrast to optimization, has not been developed with computational tractability as an objective when the dimension increases. Correspondingly, some o...


A Stochastic Optimization Approach to a Location-Allocation Problem of Organ Transplant Centers

Decision-making concerning the location of critical resources on a geographical network is important in many industries. In the healthcare system, these decisions include the location of emergency and preventive care. Location decisions play a crucial role because they determine the travel time between supply and demand points and the response time in emergencies. Organs are considered as highly ...



Journal

Journal title: Computational Optimization and Applications

Year: 2021

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-021-00271-w